fix: add multi-turn output type for openai component #57

Merged: 4 commits, Jul 17, 2025

Conversation

@iambenkay (Contributor) commented Jul 1, 2025

This introduces a fix to the OpenAI component for multi-turn conversations. The main issue was a mismatch with the request types expected by the OpenAI Responses API.
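As a rough illustration of the kind of mismatch described above (this sketch is not taken from the PR diff; the helper name and turn data are hypothetical): the OpenAI Responses API accepts a bare string for a single turn, but a multi-turn conversation has to be sent as a list of role-tagged input items, with assistant history marked as `output_text` rather than `input_text`.

```python
def build_multi_turn_request(model, turns):
    """Hypothetical sketch: build a Responses-API-style request body
    from a list of (role, text) turns. Sending the history as a plain
    string instead of items like these is the sort of type mismatch
    that produces a 400 error on multi-turn requests."""
    return {
        "model": model,
        "input": [
            {
                "role": role,
                # Assistant turns in the input history use "output_text";
                # user turns use "input_text".
                "content": [
                    {
                        "type": "output_text" if role == "assistant" else "input_text",
                        "text": text,
                    }
                ],
            }
            for role, text in turns
        ],
    }


# Example: a three-turn conversation serialized as structured input items.
req = build_multi_turn_request(
    "gpt-4.1",
    [
        ("user", "What is Golem?"),
        ("assistant", "Golem is a durable computing platform."),
        ("user", "How does it handle retries?"),
    ],
)
```

The key point is that every prior turn, including the model's own replies, is replayed as a typed item in `input`, so the server can reconstruct the conversation state.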

Working solution

multiturn.mov

Test 8 passes now.

/claim #53
close #53

@iambenkay force-pushed the multiturn-openai-bug branch from 8674ccf to d4c9df5 on July 1, 2025 12:15
@iambenkay force-pushed the multiturn-openai-bug branch from fb07943 to 668a7c5 on July 1, 2025 13:34
@iambenkay force-pushed the multiturn-openai-bug branch from 7b9d203 to 4b3b8d3 on July 3, 2025 17:17
@vigoo vigoo merged commit bd637a0 into golemcloud:main Jul 17, 2025
5 checks passed
@iambenkay deleted the multiturn-openai-bug branch July 17, 2025 17:23
Successfully merging this pull request may close these issues.

BUG: llm/openai fails with 400 error in multi-turn conversations
3 participants